China Telecom Develops First Domestic AI Models Using Huawei Chips and MoE Architecture
China Telecom has unveiled the nation's first artificial intelligence models built on the Mixture-of-Experts (MoE) architecture and trained exclusively on Huawei's Ascend 910B chips. The TeleChat3 models, with parameter counts ranging from 105 billion up to the trillion scale, were developed on Huawei's MindSpore framework, marking a significant milestone in China's push for self-reliance in AI infrastructure.
The MoE architecture, popularized by DeepSeek's V3 model in late 2024, activates only a subset of expert subnetworks for each input token, allowing model capacity to grow without a proportional increase in computational cost. While China Telecom's offering trails OpenAI's GPT-OSS-120B in performance, the achievement demonstrates growing technical sophistication in domestic AI development. The project addresses critical bottlenecks in large-scale model training within China's computational ecosystem.
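The capacity-versus-compute trade-off behind MoE can be illustrated with a toy sketch. The code below is a minimal, hypothetical NumPy example of top-k expert routing, not TeleChat3's actual implementation; all function and variable names are illustrative. Each token's router scores pick its top-k experts, so only k expert computations run per token regardless of how many experts (and thus parameters) the model holds in total.

```python
import numpy as np

def moe_forward(x, gate_w, expert_ws, top_k=2):
    """Toy MoE layer: route each token to its top-k experts.

    x: (tokens, d) input activations
    gate_w: (d, n_experts) router weights
    expert_ws: list of (d, d) expert weight matrices

    Only top_k experts execute per token, so compute scales with
    top_k, not with the total number of experts -- the reason MoE
    capacity can grow without proportional compute cost.
    """
    logits = x @ gate_w                               # (tokens, n_experts)
    top = np.argsort(logits, axis=1)[:, -top_k:]      # top-k expert indices
    sel = np.take_along_axis(logits, top, axis=1)     # their router scores
    # softmax over just the selected experts' scores
    w = np.exp(sel - sel.max(axis=1, keepdims=True))
    w /= w.sum(axis=1, keepdims=True)

    out = np.zeros_like(x)
    for t in range(x.shape[0]):        # per-token dispatch (illustrative loop)
        for k in range(top_k):
            e = top[t, k]
            out[t] += w[t, k] * (x[t] @ expert_ws[e])
    return out

# Demo: 4 experts, but each token touches only 2 of them.
rng = np.random.default_rng(0)
d, n_experts, tokens = 8, 4, 5
x = rng.standard_normal((tokens, d))
gate_w = rng.standard_normal((d, n_experts))
expert_ws = [rng.standard_normal((d, d)) for _ in range(n_experts)]
y = moe_forward(x, gate_w, expert_ws)
print(y.shape)  # (5, 8)
```

Production systems add load-balancing losses and batched expert dispatch, but the principle is the same: parameters sit in many experts while each token pays for only a few.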